# Distillation Training
DeiT Small Distilled Patch16 224
Apache-2.0
The distilled DeiT model was pre-trained and fine-tuned on ImageNet-1k at 224x224 resolution, learning from a teacher CNN via knowledge distillation.
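The model learns by matching a teacher network's predictions in addition to the ground-truth labels. As a rough illustration of the underlying idea, here is a minimal sketch of the classic soft-label (Hinton-style) distillation loss; note that DeiT's distilled variants actually use a dedicated distillation token and hard teacher labels, and all names below are hypothetical:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces softer targets.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=3.0):
    # KL divergence from the teacher's soft targets to the student's,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

In practice this term is combined with the standard cross-entropy on true labels; when the student exactly matches the teacher, the distillation term vanishes.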
Image Classification
Transformers
facebook
2,253
6
© 2025 AIbase